Collaborative Multi-output Gaussian Processes

Authors

  • Trung V. Nguyen
  • Edwin V. Bonilla
Abstract

We introduce the collaborative multi-output Gaussian process (GP) model for learning dependent tasks with very large datasets. The model fosters task correlations by mixing sparse processes and sharing multiple sets of inducing points. This facilitates the application of variational inference and the derivation of an evidence lower bound that decomposes across inputs and outputs. We learn all the parameters of the model in a single stochastic optimization framework that scales to a large number of observations per output and a large number of outputs. We demonstrate our approach on a toy problem, two medium-sized datasets and a large dataset. The model achieves superior performance compared to single output learning and previous multi-output GP models, confirming the benefits of correlating sparsity structure of the outputs via the inducing points.
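
As a rough sketch of the construction described in the abstract (our reading, not necessarily the paper's exact parameterization): each output f_i is modelled as a weighted mixture of shared sparse latent processes plus an output-specific sparse process, each latent process carrying its own set of inducing points,

\[
f_i(\mathbf{x}) \;=\; \sum_{j=1}^{Q} w_{ij}\, g_j(\mathbf{x}) \;+\; h_i(\mathbf{x}),
\qquad g_j \sim \mathcal{GP}\big(0, k_j\big), \quad h_i \sim \mathcal{GP}\big(0, k_i^{h}\big).
\]

With inducing variables \(\mathbf{u}\) for all latent processes and a variational posterior \(q(\mathbf{u})\), the evidence lower bound takes the generic stochastic-variational form

\[
\mathcal{L} \;=\; \sum_{i}\sum_{n} \mathbb{E}_{q}\!\left[\log p\big(y_{in} \mid f_i(\mathbf{x}_{in})\big)\right] \;-\; \mathrm{KL}\big[q(\mathbf{u})\,\|\,p(\mathbf{u})\big],
\]

and it is this sum over outputs i and observations n that permits mini-batch (stochastic) optimization of all parameters.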

Related papers

Multi-class Classification with Dependent Gaussian Processes

We present a novel multi-output Gaussian process model for multi-class classification. We build on the formulation of Gaussian processes via convolution of white Gaussian noise processes with a parameterized kernel and present a new class of multi-output covariance functions. The latter allow for greater flexibility in modelling relationships between outputs while being parsimonious with regard...
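
For context, the standard process-convolution construction alluded to here (a generic sketch; the paper's exact covariance may differ) builds each output by convolving a shared white-noise process w with an output-specific smoothing kernel G_i:

\[
f_i(\mathbf{x}) \;=\; \int G_i(\mathbf{x} - \mathbf{z})\, w(\mathbf{z})\, d\mathbf{z},
\qquad
\operatorname{cov}\big[f_i(\mathbf{x}), f_j(\mathbf{x}')\big] \;=\; \int G_i(\mathbf{x} - \mathbf{z})\, G_j(\mathbf{x}' - \mathbf{z})\, d\mathbf{z},
\]

so the cross-covariance between outputs is determined by the overlap of their smoothing kernels.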

Application of Non-Linear Functions at Distribution of Output SINR Gaussian Interference Channels

We have examined the convergence behavior of the LSCMA in some simple environments. Algorithms such as Multi-Target CMA, Multistage CMA, and Iterative Least Squares with Projection can be used for this purpose. The results presented here can form a basis for analysis of these multi-signal extraction techniques. Clearly, the variance and distribution of output SINR obtained with the LSCMA is al...

Collaborative Gaussian Processes for Preference Learning

We present a new model based on Gaussian processes (GPs) for learning pairwise preferences expressed by multiple users. Inference is simplified by using a preference kernel for GPs which allows us to combine supervised GP learning of user preferences with unsupervised dimensionality reduction for multi-user systems. The model not only exploits collaborative information from the shared structure...
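
The preference kernel mentioned here is, in the standard construction (sketched from the usual latent-utility formulation; details may differ in the paper), obtained by modelling a comparison x_a ≻ x_b through the difference of a latent utility function f:

\[
g(\mathbf{x}_a, \mathbf{x}_b) \;=\; f(\mathbf{x}_a) - f(\mathbf{x}_b),
\qquad
k_{\mathrm{pref}}\big((\mathbf{x}_a,\mathbf{x}_b),(\mathbf{x}_c,\mathbf{x}_d)\big)
\;=\; k(\mathbf{x}_a,\mathbf{x}_c) - k(\mathbf{x}_a,\mathbf{x}_d) - k(\mathbf{x}_b,\mathbf{x}_c) + k(\mathbf{x}_b,\mathbf{x}_d),
\]

which reduces pairwise preference learning to ordinary GP classification on pairs of items.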

mlegp: an R package for Gaussian process modeling and sensitivity analysis

Gaussian processes (GPs) are commonly used as surrogate statistical models for predicting output of computer experiments (Santner et al., 2003). Generally, GPs are both interpolators and smoothers of data and are effective predictors when the response surface of interest is a smooth function of the parameter space. The package mlegp finds maximum likelihood estimates of Gaussian processes for u...
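
The interpolating/smoothing behaviour referred to here follows from the standard GP (kriging) predictive equations, included only as background rather than as mlegp's specific implementation:

\[
\bar{f}(\mathbf{x}_*) \;=\; \mathbf{k}_*^{\top}\big(K + \sigma^2 I\big)^{-1}\mathbf{y},
\qquad
\operatorname{var}\big[f(\mathbf{x}_*)\big] \;=\; k(\mathbf{x}_*,\mathbf{x}_*) - \mathbf{k}_*^{\top}\big(K + \sigma^2 I\big)^{-1}\mathbf{k}_*;
\]

with a noise (nugget) variance \(\sigma^2 = 0\) the predictive mean interpolates the training outputs exactly, while \(\sigma^2 > 0\) yields a smoother.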

Sparse Convolved Gaussian Processes for Multi-output Regression

We present a sparse approximation approach for dependent output Gaussian processes (GP). Employing a latent function framework, we apply the convolution process formalism to establish dependencies between output variables, where each latent function is represented as a GP. Based on these latent functions, we establish an approximation scheme using a conditional independence assumption between t...
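
The conditional-independence assumption referred to here is, in generic form (a sketch; the paper's notation will differ), that the outputs decouple given the latent functions summarized by a set of inducing variables u:

\[
p(f_1, \dots, f_D \mid \mathbf{u}) \;\approx\; \prod_{d=1}^{D} p(f_d \mid \mathbf{u}),
\]

which replaces the full joint covariance over all outputs with D smaller conditionals and yields the computational savings that sparse multi-output GP approximations target.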

Journal:

Volume:   Issue:

Pages:  -

Publication date: 2014